Re-revisiting Learning on Hypergraphs: Confidence Interval and Subgradient Method

Authors

  • Chenzi Zhang
  • Shuguang Hu
  • Zhihao Gavin Tang
  • T.-H. Hubert Chan
Abstract

We revisit semi-supervised learning on hypergraphs. As in previous approaches, our method uses a convex program whose objective function is not everywhere differentiable. We exploit the non-uniqueness of the optimal solutions and consider confidence intervals, which give the exact ranges that unlabeled vertices take in any optimal solution. Moreover, we give a much simpler approach for solving the convex program, based on the subgradient method. Our experiments on real-world datasets confirm that our confidence interval approach on hypergraphs outperforms existing methods, and our subgradient method gives faster running times when the number of vertices is much larger than the number of edges.
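As a rough illustration of the kind of optimization the abstract describes (not the authors' actual formulation), a projected subgradient method for a non-differentiable hypergraph smoothness objective can be sketched as follows. The objective `sum_e max_{u,v in e}(x_u - x_v)^2`, the step-size schedule, and the toy hypergraph are all assumptions made for the sketch:

```python
import numpy as np

def subgradient_ssl(edges, labels, n, steps=2000, eta=0.1):
    """Projected subgradient sketch for hypergraph semi-supervised learning.

    Minimizes f(x) = sum_e max_{u,v in e} (x_u - x_v)^2 over the unlabeled
    coordinates, keeping labeled vertices fixed by projection.
    """
    x = np.zeros(n)
    for v, y in labels.items():
        x[v] = y
    for t in range(1, steps + 1):
        g = np.zeros(n)
        for e in edges:
            vals = x[list(e)]
            u = e[int(np.argmax(vals))]      # vertex attaining the max in e
            w = e[int(np.argmin(vals))]      # vertex attaining the min in e
            # a subgradient of max_{u,v in e} (x_u - x_v)^2 at x
            g[u] += 2 * (x[u] - x[w])
            g[w] -= 2 * (x[u] - x[w])
        x -= (eta / np.sqrt(t)) * g          # diminishing step size
        for v, y in labels.items():          # project back onto the labels
            x[v] = y
    return x

# Toy hypergraph: a path with two labeled endpoints, one unlabeled vertex
edges = [(0, 1), (1, 2)]
labels = {0: 0.0, 2: 1.0}
x = subgradient_ssl(edges, labels, n=3)
```

On this toy instance the unlabeled vertex settles at the midpoint 0.5, the unique minimizer of `x1**2 + (x1 - 1)**2`; the paper's confidence-interval idea concerns cases where such minimizers are not unique.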


Related articles

Bayes Interval Estimation on the Parameters of the Weibull Distribution for Complete and Censored Tests

A method for constructing confidence intervals on the parameters of a continuous probability distribution is developed in this paper. The objective is to present a model for the uncertainty represented by the parameters of a probability density function. As an application, confidence intervals for the two parameters of the Weibull distribution, along with their joint confidence interval, are derived. The...


A Procedure for Building Confidence Interval on the Mean of Simulation Output Data

One of the existing methods to build a confidence interval (c.i.) for the mean response in a single steady-state simulation system is the batch means method. Compared to the other existing methods (autoregressive representation, regenerative cycles, spectrum analysis, standardized time series), it is quite easy to understand and implement and performs relatively well. However, the ...


Diffusion Operator and Spectral Analysis for Directed Hypergraph Laplacian

In spectral graph theory, Cheeger's inequality gives upper and lower bounds on edge expansion in normal graphs in terms of the second eigenvalue of the graph's Laplacian operator. Recently, this inequality has been extended to undirected hypergraphs and directed normal graphs via a non-linear operator associated with a diffusion process in the underlying graph. In this work, we develop a uni...


Stochastic Subgradient Methods

Stochastic subgradient methods play an important role in machine learning. In this project we introduced the concepts of subgradient methods and stochastic subgradient methods, discussed their convergence conditions as well as their strengths and weaknesses relative to competing methods, and demonstrated the application of (stochastic) subgradient methods to machine learning with a running example of tr...
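A minimal stochastic subgradient sketch on a regularized hinge-loss objective (Pegasos-style updates, a common textbook instance of the technique) can look like the following; the synthetic data, regularization constant, and step-size schedule are illustrative assumptions, not taken from the cited project:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data (hypothetical example)
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1.0, 1.0, -1.0, -1.0])

w = np.zeros(2)
lam = 0.01                                  # regularization strength
for t in range(1, 501):
    i = rng.integers(len(X))                # sample one training example
    eta = 1.0 / (lam * t)                   # standard diminishing step size
    # subgradient of (lam/2)*||w||^2 + max(0, 1 - y_i * w.x_i)
    g = lam * w
    if y[i] * (w @ X[i]) < 1:               # hinge term is active
        g -= y[i] * X[i]
    w -= eta * g
```

Because the hinge loss is non-differentiable at the margin boundary, the update uses a subgradient rather than a gradient; sampling one example per step is what makes the method stochastic.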


A confidence-aware interval-based trust model

It is a common and useful task in a web of trust to evaluate the trust value between two nodes using intermediate nodes. This technique is widely used when the source node has no experience of direct interaction with the target node, or the direct trust is not reliable enough by itself. If trust is used to support decision-making, it is important to have not only an accurate estimate of trust, ...




Journal:

Volume   Issue

Pages  -

Publication date: 2017